Diffusion Process


Online ICA: Understanding Global Dynamics of Nonconvex Optimization via Diffusion Processes

Neural Information Processing Systems

Solving statistical learning problems often involves nonconvex optimization. Despite the empirical success of nonconvex statistical optimization methods, their global dynamics, especially convergence to the desirable local minima, remain less well understood in theory. In this paper, we propose a new analytic paradigm based on diffusion processes to characterize the global dynamics of nonconvex statistical optimization. As a concrete example, we study stochastic gradient descent (SGD) for the tensor decomposition formulation of independent component analysis. In particular, we cast different phases of SGD into diffusion processes, i.e., solutions to stochastic differential equations.
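The abstract's core idea, treating SGD iterates as a discretised diffusion process, can be illustrated on a toy one-dimensional nonconvex objective. This is a minimal sketch under stated assumptions: the objective, step size, and additive Gaussian gradient noise below are illustrative choices, not the paper's tensor-decomposition ICA setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nonconvex objective f(x) = (x^2 - 1)^2: two minima at +/-1, a local
# maximum at 0. Its gradient is 4x(x^2 - 1).
def grad(x):
    return 4.0 * x * (x * x - 1.0)

def sgd(x0, eta, sigma, steps):
    """SGD with additive Gaussian noise standing in for sampling noise."""
    x = x0
    for _ in range(steps):
        g = grad(x) + sigma * rng.standard_normal()
        x = x - eta * g
    return x

def euler_maruyama(x0, eta, sigma, steps):
    """Euler-Maruyama discretisation of the SDE
        dX_t = -grad f(X_t) dt + sigma * sqrt(eta) dW_t,
    with the identification dt = eta, so one SDE step matches one SGD step."""
    x = x0
    dt = eta  # continuous time advances by the learning rate per iterate
    for _ in range(steps):
        noise = rng.standard_normal()
        x = x - grad(x) * dt + sigma * np.sqrt(eta) * np.sqrt(dt) * noise
    return x
```

Run from the same start, both trajectories drift into one of the two minima near +/-1; with `dt = eta` the two update rules coincide term by term, which is the sense in which small-step SGD tracks a diffusion.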



Neural Information Processing Systems

Hypergraphs are important objects for modeling ternary or higher-order relations, and have a number of applications in analysing many complex datasets occurring in practice.
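To make the "higher-order relations" point concrete, here is a minimal sketch of a hypergraph as a list of vertex sets, so a single hyperedge can capture a ternary or 4-ary relation directly. The representation and helper functions are illustrative assumptions, not taken from the paper.

```python
# A hypergraph as a list of hyperedges; each hyperedge is a frozenset of
# vertices, so one edge can relate three or more objects at once.
hyperedges = [
    frozenset({"a", "b", "c"}),       # ternary relation among a, b, c
    frozenset({"b", "c", "d", "e"}),  # 4-ary relation
]

def incident(v, edges):
    """All hyperedges containing vertex v."""
    return [e for e in edges if v in e]

def degree(v, edges):
    """Number of hyperedges containing vertex v."""
    return len(incident(v, edges))
```

An ordinary graph is the special case where every hyperedge has exactly two vertices; the list-of-sets form makes the generalisation explicit.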



Diffusion Twigs with Loop Guidance for Conditional Graph Generation

Neural Information Processing Systems

We introduce a novel score-based diffusion framework named Twigs that incorporates multiple co-evolving flows for enriching conditional generation tasks. Specifically, a central or trunk diffusion process is associated with a primary variable (e.g., graph structure), and additional offshoot or stem processes are dedicated